Radial basis function network
In the field of mathematical modeling, a radial basis function network is an artificial neural network that uses radial basis functions as activation functions. The output of the network is a linear combination of radial basis functions of the inputs and neuron parameters. Radial basis function networks have many uses, including function approximation, time series prediction, classification, and system control. They were first formulated in a 1988 paper by Broomhead and Lowe, both researchers at the Royal Signals and Radar Establishment.
==Network architecture==

Radial basis function (RBF) networks typically have three layers: an input layer, a hidden layer with a non-linear RBF activation function, and a linear output layer. The input can be modeled as a vector of real numbers \mathbf{x} \in \mathbb{R}^n. The output of the network is then a scalar function of the input vector, \varphi : \mathbb{R}^n \to \mathbb{R}, and is given by
:\varphi(\mathbf{x}) = \sum_{i=1}^N a_i \rho(\left \Vert \mathbf{x}-\mathbf{c}_i \right \Vert)
where N is the number of neurons in the hidden layer, \mathbf{c}_i is the center vector for neuron i, and a_i is the weight of neuron i in the linear output neuron. Functions that depend only on the distance from a center vector are radially symmetric about that vector, hence the name radial basis function. In the basic form all inputs are connected to each hidden neuron. The norm is typically taken to be the Euclidean distance (although the Mahalanobis distance appears to perform better in general) and the radial basis function is commonly taken to be Gaussian
: \rho \big ( \left \Vert \mathbf{x} - \mathbf{c}_i \right \Vert \big ) = \exp \left( -\beta \left \Vert \mathbf{x} - \mathbf{c}_i \right \Vert ^2 \right ) .
The Gaussian basis functions are local to the center vector in the sense that
:\lim_{\left \Vert \mathbf{x} \right \Vert \to \infty} \rho(\left \Vert \mathbf{x} - \mathbf{c}_i \right \Vert) = 0 ,
i.e. changing parameters of one neuron has only a small effect for input values that are far away from the center of that neuron.
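As a minimal sketch (not from the article), the forward pass defined by the two equations above can be written in a few lines of NumPy; the centers, weights, and \beta value here are illustrative, not taken from any real model:

```python
import numpy as np

def rbf_network(x, centers, weights, beta):
    """Evaluate phi(x) = sum_i a_i * exp(-beta * ||x - c_i||^2)."""
    # Euclidean distance from x to each center c_i
    dists = np.linalg.norm(x - centers, axis=1)
    # Gaussian basis responses, weighted and summed by the linear output neuron
    return np.dot(weights, np.exp(-beta * dists**2))

# Illustrative network: N = 2 hidden neurons with centers in R^2
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
weights = np.array([1.0, 2.0])

# At x = c_1 the first basis responds with exp(0) = 1,
# the second with exp(-beta * 2), showing the local character of the Gaussian.
y = rbf_network(np.array([0.0, 0.0]), centers, weights, beta=1.0)
```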
Given certain mild conditions on the shape of the activation function, RBF networks are universal approximators on a compact subset of \mathbb{R}^n. This means that an RBF network with enough hidden neurons can approximate any continuous function with arbitrary precision.
The parameters a_i , \mathbf{c}_i , and \beta are determined in a manner that optimizes the fit between \varphi and the data.

Excerpt source: Wikipedia, the free encyclopedia, article "Radial basis function network".